Optimal Control of Crop Irrigation based on the Hamilton-Jacobi-Bellman Equation

Authors

  • Jacques Ramanathan
  • Yuting Chen
  • Paul-Henry Cournède
Abstract

Water management in agriculture is a key issue due to the increasing problem of water scarcity worldwide. Based on recent progress in the dynamic modeling of plant growth in interaction with the water resource, our objective is to study the optimal control problem of crop irrigation. For this purpose, we first describe the LNAS model for sugar beet growth, which drives the dynamics of both plant biomass and the soil water reserve. We then introduce the utility function corresponding to the farmer's profit and derive the value function from the Hamilton-Jacobi-Bellman (HJB) equation. A backward finite-difference scheme is then implemented to solve the HJB equation; it is proved to converge under an appropriate Courant-Friedrichs-Lewy condition on the discretization step. A few numerical simulations are provided to illustrate the resolution.
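The numerical approach summarized above (a backward dynamic-programming sweep for the HJB equation, with a Courant-Friedrichs-Lewy-type restriction on the time step) can be illustrated with a minimal sketch. The Python code below solves a deliberately simplified one-dimensional problem in which the state is the soil water reserve and the control is the irrigation rate; the drift, the running reward, and every numerical parameter (drift, running_reward, U_MAX, the grid sizes) are illustrative assumptions and do not reproduce the LNAS model or the paper's actual scheme.

import numpy as np

# One-dimensional illustrative state: soil water reserve w (mm).
# All dynamics, rewards and parameters are hypothetical placeholders,
# not the LNAS sugar-beet model used in the paper.
T, W_MAX, U_MAX = 30.0, 100.0, 10.0   # horizon (days), max reserve (mm), max irrigation (mm/day)
NW = 201
w = np.linspace(0.0, W_MAX, NW)
dw = w[1] - w[0]

def drift(w, u):
    # Assumed water balance: irrigation input minus a loss proportional to the reserve.
    return u - 0.15 * w

def running_reward(w, u):
    # Assumed instantaneous profit: growth benefit of available water minus water cost.
    return 2.0 * np.sqrt(np.maximum(w, 0.0)) - 0.5 * u

# CFL-type restriction for the explicit upwind scheme: dt * max|drift| <= dw.
max_speed = max(abs(drift(0.0, U_MAX)), abs(drift(W_MAX, 0.0)))
dt = 0.9 * dw / max_speed
nt = int(np.ceil(T / dt))
dt = T / nt

controls = np.linspace(0.0, U_MAX, 21)   # discretized admissible irrigation rates
V = np.zeros(NW)                         # terminal condition: no salvage value assumed
policy = np.zeros((nt, NW))

# Backward sweep in time: maximize the discrete Hamiltonian over the control grid.
for n in range(nt - 1, -1, -1):
    best_val = np.full(NW, -np.inf)
    best_u = np.zeros(NW)
    for u in controls:
        f = drift(w, u)
        # Upwind one-sided differences, oriented by the sign of the drift;
        # zero-derivative (Neumann-like) treatment at the boundaries.
        dV_fwd = np.zeros(NW)
        dV_bwd = np.zeros(NW)
        dV_fwd[:-1] = (V[1:] - V[:-1]) / dw
        dV_bwd[1:] = (V[1:] - V[:-1]) / dw
        dV = np.where(f > 0.0, dV_fwd, dV_bwd)
        candidate = V + dt * (running_reward(w, u) + f * dV)
        improved = candidate > best_val
        best_val = np.where(improved, candidate, best_val)
        best_u = np.where(improved, u, best_u)
    V = best_val
    policy[n] = best_u

print("approximate value at w = 50 mm, t = 0:", np.interp(50.0, w, V))

In this sketch the step dt is chosen so that dt * max|drift| <= dw, a discrete analogue of the CFL condition mentioned in the abstract; without such a restriction the explicit scheme can become unstable and fail to approximate the value function.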


Related articles

A New Near Optimal High Gain Controller For The Non-Minimum Phase Affine Nonlinear Systems

In this paper, a new analytical method to find a near-optimal high-gain controller for non-minimum-phase affine nonlinear systems is introduced. This controller is derived based on the closed-form solution of the Hamilton-Jacobi-Bellman (HJB) equation associated with the cheap control problem. This methodology employs an algebraic equation with parametric coefficients for the systems with s...


Hamilton-Jacobi-Bellman formalism for optimal climate control of greenhouse crop

The paper describes a simplified dynamic model of a greenhouse tomato crop and the optimal control problem related to the seasonal benefit of the grower. An HJB formalism is used, and the explicit form of the Krotov–Bellman function is obtained for different growth stages. Simulation results are shown.


Nonlinear Optimal Control Techniques Applied to a Launch Vehicle Autopilot

This paper presents an application of nonlinear optimal control techniques to the design of launch vehicle autopilots. The optimal control is given by the solution to the Hamilton-Jacobi-Bellman (HJB) equation, which in this case cannot be solved explicitly. A method based upon Successive Galerkin Approximation (SGA) is used to obtain an approximate optimal solution. Simulation results invo...


Discrete Hamilton-Jacobi Theory

We develop a discrete analogue of the Hamilton–Jacobi theory in the framework of discrete Hamiltonian mechanics. We first reinterpret the discrete Hamilton–Jacobi equation derived by Elnatanov and Schiff in the language of discrete mechanics. The resulting discrete Hamilton–Jacobi equation is discrete only in time, and is shown to recover the Hamilton–Jacobi equation in the continuous-time...


Error estimation and adaptive discretization for the discrete stochastic Hamilton-Jacobi-Bellman equation

Generalizing an idea from deterministic optimal control, we construct a posteriori error estimates for the spatial discretization error of the stochastic dynamic programming method based on a discrete Hamilton–Jacobi–Bellman equation. These error estimates are shown to be efficient and reliable; furthermore, a priori bounds on the estimates depending on the regularity of the approximate solutio...




Publication date: 2013